12 research outputs found

    Approximation Strategies for Incomplete MaxSAT

    Get PDF
    Incomplete MaxSAT solving aims to quickly find a solution that attempts to minimize the sum of the weights of the unsatisfied soft clauses without providing any optimality guarantees. In this paper, we propose two approximation strategies for improving incomplete MaxSAT solving. In one of the strategies, we cluster the weights and approximate them with a representative weight. In another strategy, we break up the problem of minimizing the sum of weights of unsatisfiable clauses into multiple minimization subproblems. Experimental results show that approximation strategies can be used to find better solutions than the best incomplete solvers in the MaxSAT Evaluation 2017.
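    As a toy illustration of the first strategy (weight clustering), the sketch below groups soft-clause weights into a few buckets and replaces each weight with its bucket's mean. The bucketing scheme, the cluster count, and the choice of the mean as representative are illustrative assumptions, not the procedure from the paper.

```python
# Hypothetical sketch: approximate soft-clause weights by clustering them into
# a few contiguous buckets and replacing each weight with its bucket's mean.
from math import ceil
from statistics import mean

def cluster_weights(weights, num_clusters=4):
    """Map every weight to the mean of its bucket of nearby weight values."""
    srt = sorted(set(weights))
    size = ceil(len(srt) / num_clusters)                     # bucket width
    buckets = [srt[i:i + size] for i in range(0, len(srt), size)]
    representative = {}
    for bucket in buckets:
        rep = mean(bucket)
        for w in bucket:
            representative[w] = rep
    return [representative[w] for w in weights]

if __name__ == "__main__":
    soft_weights = [1, 2, 2, 3, 50, 55, 1000, 1200]
    print(cluster_weights(soft_weights))
    # [1.5, 1.5, 1.5, 26.5, 26.5, 527.5, 527.5, 1200] -> fewer distinct weights
```

    Fewer distinct weights give the incomplete solver a coarser but easier objective; the approximate optimum can then be refined against the original weights.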

    On Tackling the Limits of Resolution in SAT Solving

    Full text link
    The practical success of Boolean Satisfiability (SAT) solvers stems from the CDCL (Conflict-Driven Clause Learning) approach to SAT solving. However, from a propositional proof complexity perspective, CDCL is no more powerful than the resolution proof system, for which many hard examples exist. This paper proposes a new problem transformation, which enables reducing the decision problem for formulas in conjunctive normal form (CNF) to the problem of solving maximum satisfiability over Horn formulas. Given the new transformation, the paper proves a polynomial bound on the number of MaxSAT resolution steps for pigeonhole formulas. This result is in clear contrast with earlier results on the length of proofs of MaxSAT resolution for pigeonhole formulas. The paper also establishes the same polynomial bound in the case of modern core-guided MaxSAT solvers. Experimental results, obtained on CNF formulas known to be hard for CDCL SAT solvers, show that these can be efficiently solved with modern MaxSAT solvers.
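    One standard way to reduce CNF decision to Horn MaxSAT is a dual-rail-style encoding; the sketch below, assuming a DIMACS-style integer-literal representation, illustrates the idea and may differ in details from the transformation used in the paper.

```python
# Sketch of a dual-rail-style reduction from CNF to Horn MaxSAT (illustrative;
# the paper's exact encoding may differ in details).
def dual_rail(clauses, num_vars):
    """For each variable v introduce p_v (index v) and n_v (index v + num_vars).
    Positive literal v -> ~n_v, negative literal -v -> ~p_v.
    Returns (hard_clauses, soft_unit_clauses): all hard clauses are Horn
    (no positive literals), all soft clauses are positive units of weight 1."""
    hard = []
    for clause in clauses:
        hard.append([-(abs(l) + num_vars) if l > 0 else -abs(l) for l in clause])
    for v in range(1, num_vars + 1):
        hard.append([-v, -(v + num_vars)])      # p_v and n_v never both true
    soft = [[v] for v in range(1, 2 * num_vars + 1)]
    return hard, soft

# Under this encoding, the original CNF is satisfiable iff an optimal MaxSAT
# solution falsifies exactly num_vars of the 2*num_vars soft units
# (one per original variable), so the decision problem reduces to comparing
# the optimum cost with num_vars.
```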

    On maximal frequent itemsets mining with constraints

    No full text
    Recently, a new declarative mining framework based on constraint programming (CP) and propositional satisfiability (SAT) has been designed to deal with several pattern mining tasks. The itemset mining problem has been modeled using constraints whose models correspond to the patterns to be mined. In this paper, we propose a new propositional satisfiability based approach for mining maximal frequent itemsets that extends the one proposed in [20]. We show that instead of adding constraints to the initial SAT-based itemset mining encoding, the maximal itemsets can be obtained by performing clause learning during search. A major strength of our approach lies in the compactness of the proposed encoding and the efficiency of the SAT-based maximal itemsets enumeration derived using blocked clauses. Experimental results on several datasets show the feasibility and the efficiency of our approach.
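    To illustrate how blocking clauses can drive maximal-itemset enumeration with a SAT solver, here is a minimal sketch using the PySAT library. The variable layout, the greedy maximalisation step, and the frequency encoding are assumptions made for illustration and are not the paper's encoding.

```python
# Illustrative sketch: SAT-based enumeration of maximal frequent itemsets via
# blocking clauses, using PySAT.  Not the encoding from the paper.
from pysat.solvers import Glucose3
from pysat.card import CardEnc, EncType

def maximal_frequent_itemsets(transactions, num_items, minsup):
    support = lambda iset: sum(1 for t in transactions if iset <= t)
    item_var = lambda i: i + 1                      # items 0..m-1 -> vars 1..m
    cov_var = lambda t: num_items + t + 1           # one variable per transaction

    solver = Glucose3()
    # cov_var(t) true -> every item outside transaction t is excluded
    for t, trans in enumerate(transactions):
        for i in range(num_items):
            if i not in trans:
                solver.add_clause([-cov_var(t), -item_var(i)])
    # at least minsup transactions must contain the chosen itemset
    card = CardEnc.atleast(lits=[cov_var(t) for t in range(len(transactions))],
                           bound=minsup,
                           top_id=num_items + len(transactions),
                           encoding=EncType.seqcounter)
    for cl in card.clauses:
        solver.add_clause(cl)

    maximal = []
    while solver.solve():
        model = set(solver.get_model())
        iset = {i for i in range(num_items) if item_var(i) in model}
        # greedily extend to a maximal frequent itemset (checked by counting)
        for i in range(num_items):
            if i not in iset and support(iset | {i}) >= minsup:
                iset.add(i)
        maximal.append(iset)
        # blocking clause: future solutions must contain an item outside iset
        block = [item_var(i) for i in range(num_items) if i not in iset]
        if not block:                               # all items fit together
            break
        solver.add_clause(block)
    return maximal

if __name__ == "__main__":
    data = [{0, 1, 2}, {0, 1}, {0, 2}, {1, 2}]
    print(maximal_frequent_itemsets(data, num_items=3, minsup=2))
    # -> the maximal frequent itemsets {0, 1}, {0, 2}, {1, 2} (in some order)
```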

    Interior Point Methods For Global Optimization

    No full text
    Interior point methods, originally invented in the context of linear programming, have found a much broader range of applications, including global optimization problems that arise in engineering, computer science, operations research, and other disciplines. This chapter overviews the conceptual basis and applications of interior point methods for some classes of global optimization problems.
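    As a toy illustration of the barrier idea behind interior point methods, the sketch below minimizes a one-dimensional objective under an inequality constraint with a log barrier and a shrinking barrier parameter. The example problem, step control, and parameter schedule are illustrative assumptions, not taken from the chapter.

```python
# Minimal log-barrier sketch: minimize (x - 2)^2 subject to x <= 1.
def barrier_solve(mu=1.0, shrink=0.1, tol=1e-8):
    x = 0.0                                    # strictly feasible start (x < 1)
    while mu > tol:
        # Newton steps on the barrier objective (x - 2)^2 - mu * log(1 - x)
        for _ in range(50):
            grad = 2 * (x - 2) + mu / (1 - x)
            hess = 2 + mu / (1 - x) ** 2
            step = grad / hess
            while x - step >= 1:               # damp to stay strictly feasible
                step *= 0.5
            x -= step
            if abs(step) < tol:
                break
        mu *= shrink                           # follow the central path
    return x

print(barrier_solve())   # approaches the constrained optimum x = 1
```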

    A parallel SAT-based framework for closed frequent itemsets mining

    No full text
    The constraint programming (CP) and propositional satisfiability (SAT) based framework for modeling and solving pattern mining tasks has gained a considerable audience in recent years. However, this nice declarative and generic framework encounters a scaling problem. The huge size of the constraint networks/propositional formulas encoding large datasets is identified as the main bottleneck of most existing approaches. In this paper, we propose a parallel SAT-based framework for the itemset mining problem to push forward the solving efficiency. The proposed approach is based on a divide-and-conquer paradigm, where the transaction database is partitioned using item-based guiding paths. Such decomposition allows us to derive smaller and independent Boolean formulas that can be solved in parallel. The performance and scalability of the proposed algorithm are evaluated through extensive experiments on several datasets. We demonstrate that our partition-based parallel SAT approach outperforms other CP approaches even in the sequential case, while significantly reducing the performance gap with specialized approaches.
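    The sketch below illustrates the item-based guiding-path idea: subproblem k assumes item k is in the itemset and all smaller-index items are not, so the subproblems partition the search space and can be solved independently. The brute-force per-path miner stands in for the paper's SAT-based (closed-itemset) solver and simply enumerates frequent itemsets; the database projection and the process pool are illustrative assumptions.

```python
# Illustrative sketch of item-based guiding-path decomposition for parallel
# itemset mining; a brute-force miner replaces the SAT solver for brevity.
from itertools import combinations
from multiprocessing import Pool

def mine_path(args):
    transactions, num_items, minsup, k = args
    # guiding path k: item k is in the itemset, items 0..k-1 are not
    db = [t for t in transactions if k in t]        # project the database
    candidates = list(range(k + 1, num_items))
    found = []
    for r in range(len(candidates) + 1):
        for extra in combinations(candidates, r):
            iset = {k, *extra}
            if sum(1 for t in db if iset <= t) >= minsup:
                found.append(iset)
    return found

def parallel_mine(transactions, num_items, minsup):
    jobs = [(transactions, num_items, minsup, k) for k in range(num_items)]
    with Pool() as pool:
        results = pool.map(mine_path, jobs)         # one subproblem per path
    return [iset for part in results for iset in part]

if __name__ == "__main__":
    data = [{0, 1, 2}, {0, 1}, {0, 2}, {1, 2}]
    print(parallel_mine(data, num_items=3, minsup=2))
```

    Because each guiding path fixes which items may appear, every frequent itemset is generated by exactly one subproblem, and each subproblem only needs the transactions containing its pivot item.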

    Illinois-Specific LRFR Live-Load Factors Based on Truck Data

    Get PDF
    This research project has a focus on the load and resistance factored rating (LRFR) live-load factors for load rating bridges in Illinois. The study’s objectives were to examine the adequacy of available Illinois weigh-in-motion (WIM) data and to develop refined live-load factors for Illinois LRFR practice, based on recorded truck loads in Illinois. There are currently 20 operating WIM sites in Illinois, each next to a weigh station. Initially, only one WIM site was providing two lanes of simultaneously recorded truck-weight data, while the remaining 19 were collecting data for the driving lane only. Two-lane WIM data are important for live-load factor refinement because it is the cluster events involving trucks in different lanes that induce maximum load effects in primary bridge components such as girders. Thus, such data are critical to live-load factors. Upon recommendation from this project, the capability of passing-lane recording was promptly added to two more of the 20 sites. An additional effort was made in this study to simulate the passing lane’s data for the remaining 17 sites, to maximize the use of Illinois-relevant WIM data for covering the entire state. This simulation used the probability of multiple trucks in a cluster, based on WIM data from eight states including Illinois. It also used truck-weight-demography information and headway distances of trucks in a cluster from all available Illinois sites. This simulation method was tested and shown in the present project to be reliable for the Illinois calibration. The resulting truck records of these 17 sites and those recorded at the other 3 sites capable of providing two lanes of truck-weight data from 2013 to 2017 were then used to develop refined live-load factors for LRFR in Illinois. Illinois trucks are seen in these WIM data to be less severe than those weighed in Canada, which were used in calibrating the current AASHTO LRFD Bridge Design Specifications (BDS) (2017). Illinois trucks recorded in the WIM data were also found to have behaved with little or no influence from the nearby weigh station. Four load-rating cases are addressed in this project in calibrating LRFR live-load factors for Illinois: design load, legal load, routine-permit load, and special-permit load. Based on calibration using Illinois truck-weight records, no change for the design load rating is recommended. Lower live-load factors than those prescribed in the current MBE are recommended for the other three cases for Illinois, by about 8% to 14%, depending on average daily truck traffic (ADTT). Illustrative examples using the recommended live-load factors have been prepared and presented in this report. It is also recommended that the Illinois Department of Transportation (IDOT) continue to keep the WIM stations well maintained, including periodic calibration of the weight sensors and systems; gather more truck-weight data; review them at least biennially; and focus on possible growth of truck load in both magnitude and volume. When funding becomes available, passing-lane recording is recommended to be added to those WIM sites that currently do not have this capability. Truck-data gathering is also recommended for sites where congested truck traffic is often observed, given adequate funding for such facilities.
    IDOT-R27-171